PyDigger - unearthing stuff about Python


| Name | Version | Summary | Date |
|------|---------|---------|------|
| flash-attn | 2.7.0.post2 | Flash Attention: Fast and Memory-Efficient Exact Attention | 2024-11-13 23:53:29 |
| causal-conv1d | 1.4.0 | Causal depthwise conv1d in CUDA, with a PyTorch interface | 2024-06-29 02:27:20 |
| quant-matmul | 1.2.0 | Quantized MatMul in CUDA with a PyTorch interface | 2024-03-20 03:44:36 |
| fast-hadamard-transform | 1.0.4.post1 | Fast Hadamard Transform in CUDA, with a PyTorch interface | 2024-02-13 05:49:17 |
| flash-attn-wheels-test | 2.0.8.post17 | Flash Attention: Fast and Memory-Efficient Exact Attention | 2023-08-13 21:27:09 |
| flash-attn-xwyzsn | 1.0.7 | Flash Attention: Fast and Memory-Efficient Exact Attention | 2023-06-01 03:53:40 |
Author: Tri Dao
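The rows above mirror the name, version, summary, and upload-time metadata that PyPI exposes for each package. As a minimal sketch of how such a row can be fetched, the snippet below queries PyPI's public JSON API (https://pypi.org/pypi/&lt;name&gt;/json); the helper name `package_row` is hypothetical, and the upload time taken here is that of the latest release's first file, which may differ slightly from the timestamp PyDigger records.

```python
# Minimal sketch: rebuild one table row from PyPI's JSON API.
# Assumes network access; package_row is a hypothetical helper name.
import json
from urllib.request import urlopen

def package_row(name: str) -> tuple[str, str, str, str]:
    """Return (name, version, summary, upload time) for a PyPI package."""
    with urlopen(f"https://pypi.org/pypi/{name}/json") as resp:
        data = json.load(resp)
    info = data["info"]
    # Take the upload time of the first file of the latest release, if any.
    urls = data.get("urls") or []
    uploaded = urls[0]["upload_time"] if urls else "n/a"
    return info["name"], info["version"], info["summary"], uploaded

for pkg in ["flash-attn", "causal-conv1d", "fast-hadamard-transform"]:
    print(" | ".join(package_row(pkg)))
```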